PLLuM-12B-chat is the 12-billion-parameter, dialogue-optimized member of the PLLuM (Polish Large Language Model) family. Designed specifically for Polish and other Slavic and Baltic languages, it is aligned through instruction fine-tuning and preference learning for safe and effective interaction.
Large Language Model
Transformers
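
Since the model is tagged for the Transformers library, a minimal usage sketch follows. The hub repo id `CYFRAGOVPL/PLLuM-12B-chat`, the example prompt, and the generation settings are assumptions for illustration, not details confirmed by this card.

```python
# A minimal sketch of chatting with the model via Hugging Face Transformers.
# The repo id below is an assumption; verify it on the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CYFRAGOVPL/PLLuM-12B-chat"  # assumed hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed precision; adjust to your hardware
    device_map="auto",
)

# Example Polish prompt: "Write a short poem about the Vistula."
messages = [{"role": "user", "content": "Napisz krótki wiersz o Wiśle."}]

# Format the conversation with the model's chat template and generate a reply.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.7,  # assumed sampling settings
)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```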